Selective memory loss in aphasics: An insight from pseudo-recurrent connectionist networks
Author
Abstract
McClelland, McNaughton, and O'Reilly [15] suggest that the brain overcomes catastrophic interference by means of the hippocampus-neocortex separation. French [8] has developed a memory model that incorporates this separation into two distinct areas and uses pseudopatterns [23] to transfer information from one area of the memory to the other. This network gradually produces highly compact internal representations which, while they ensure efficient processing, are also highly susceptible to damage. Internal representations of categories must reflect the variance within those categories. Because the variance within biological categories is, in general, smaller than that within artificial categories, and because memory compaction gradually makes all representations proportionately less distributed, the representations of low-variance biological categories are likely to be the most adversely affected by random damage to the network. This may help explain the selective memory loss for natural categories, compared to artificial categories, observed in aphasics.
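A minimal sketch of the pseudopattern transfer mechanism the abstract describes, assuming two small feedforward stores (a fast "hippocampal" network and a slow "neocortical" network) trained on an autoassociative task. The layer sizes, random binary inputs, and plain backprop trainer below are illustrative assumptions, not the implementation of French [8].

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class MLP:
    """Small one-hidden-layer network trained with plain backprop."""
    def __init__(self, n_in, n_hid, n_out, lr=0.5):
        self.W1 = rng.normal(0, 0.3, (n_in, n_hid))
        self.W2 = rng.normal(0, 0.3, (n_hid, n_out))
        self.lr = lr

    def forward(self, x):
        self.h = sigmoid(x @ self.W1)
        self.y = sigmoid(self.h @ self.W2)
        return self.y

    def train(self, X, T, epochs=500):
        for _ in range(epochs):
            for x, t in zip(X, T):
                y = self.forward(x[None, :])
                d_out = (y - t) * y * (1 - y)
                d_hid = (d_out @ self.W2.T) * self.h * (1 - self.h)
                self.W2 -= self.lr * self.h.T @ d_out
                self.W1 -= self.lr * x[None, :].T @ d_hid

def pseudopatterns(net, n, n_in):
    """Pair random inputs with the network's own outputs, approximating
    the input-output function the network currently stores."""
    X = rng.integers(0, 2, (n, n_in)).astype(float)
    Y = np.vstack([net.forward(x[None, :]) for x in X])
    return X, Y

# Two stores: a fast "hippocampal" net that learns new items quickly, and a
# slow "neocortical" net that consolidates them alongside pseudopatterns of
# what it already knows (illustrative sizes).
n_in, n_hid, n_out = 8, 6, 8
hippo = MLP(n_in, n_hid, n_out)
cortex = MLP(n_in, n_hid, n_out)

old_X = rng.integers(0, 2, (5, n_in)).astype(float)
cortex.train(old_X, old_X)                     # previously consolidated items

new_X = rng.integers(0, 2, (3, n_in)).astype(float)
hippo.train(new_X, new_X)                      # new items learned rapidly in the fast store

ps_X, ps_Y = pseudopatterns(cortex, 32, n_in)  # approximate the old knowledge
mix_X = np.vstack([new_X, ps_X])
mix_Y = np.vstack([hippo.forward(new_X), ps_Y])
cortex.train(mix_X, mix_Y)                     # consolidate new items without erasing old ones
```

The point of the `pseudopatterns` function is that the random inputs are paired with the consolidated network's own responses, so interleaving them with the new items lets the slow store absorb new knowledge while rehearsing an approximation of what it already knows instead of overwriting it.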
Similar articles
Using pseudo-recurrent connectionist networks to solve the problem of sequential learning
A major problem facing connectionist models of memory is how to make them simultaneously sensitive to, but not disrupted by, new input. This paper describes one solution to this problem. The resulting connectionist architecture is capable of sequential learning and exhibits gradual forgetting where standard connectionist architectures may forget catastrophically. The proposed architecture relie...
A Novel Approach to On-Line Handwriting Recognition Based on Bidirectional Long Short-Term Memory Networks
In this paper we introduce a new connectionist approach to on-line handwriting recognition and address in particular the problem of recognizing handwritten whiteboard notes. The approach uses a bidirectional recurrent neural network with long short-term memory blocks. We use a recently introduced objective function, known as Connectionist Temporal Classification (CTC), that directly trains the ...
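A minimal sketch, assuming PyTorch, of the two components this abstract names: a bidirectional LSTM over framewise trajectory features and the Connectionist Temporal Classification (CTC) loss, which trains the network directly on unsegmented label sequences. The feature dimension, alphabet size, and random data are placeholders, not the authors' whiteboard-note setup.

```python
import torch
import torch.nn as nn

# Toy dimensions: 10 trajectory features per time step, 27 output classes
# (26 letters plus the CTC blank at index 0). All sizes are illustrative.
n_feat, n_hidden, n_classes = 10, 64, 27

class BLSTMRecognizer(nn.Module):
    def __init__(self):
        super().__init__()
        self.blstm = nn.LSTM(n_feat, n_hidden, bidirectional=True, batch_first=True)
        self.proj = nn.Linear(2 * n_hidden, n_classes)  # both directions -> class scores

    def forward(self, x):                  # x: (batch, time, n_feat)
        out, _ = self.blstm(x)
        return self.proj(out)              # (batch, time, n_classes)

model = BLSTMRecognizer()
ctc = nn.CTCLoss(blank=0)

# One random "pen trajectory" of 50 frames labelled with a 7-character target.
x = torch.randn(1, 50, n_feat)
target = torch.randint(1, n_classes, (1, 7))
input_lengths = torch.tensor([50])
target_lengths = torch.tensor([7])

log_probs = model(x).log_softmax(dim=2).permute(1, 0, 2)  # CTC expects (time, batch, classes)
loss = ctc(log_probs, target, input_lengths, target_lengths)
loss.backward()                            # gradients for end-to-end training
print(float(loss))
```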
Pseudo-recurrent Connectionist Networks: An Approach to the 'Sensitivity-Stability' Dilemma
In order to solve the “sensitivity-stability” problem — and its immediate correlate, the problem of sequential learning — it is crucial to develop connectionist architectures that are simultaneously sensitive to, but not excessively disrupted by, new input. French (1992) suggested that to alleviate a particularly severe form of this disruption, catastrophic forgetting, it was necessary for netw...
متن کاملLearn more by training less: systematicity in sentence processing by recurrent networks
Connectionist models of sentence processing must learn to behave systematically by generalizing from a small training set. To what extent recurrent neural networks manage this generalization task is investigated. In contrast to Van der Velde et al. (Connection Sci., 16, pp. 21–46, 2004), it is found that simple recurrent networks do show so-called weak combinatorial systematicity, although thei...
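For readers unfamiliar with the architecture under discussion, the following is a minimal sketch of a simple recurrent (Elman) network for next-word prediction in Python/NumPy. The toy vocabulary, layer sizes, and untrained weights are illustrative assumptions with no relation to the training sets studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Tiny illustrative vocabulary; real training sets are far larger.
vocab = ["boy", "girl", "sees", "hears", "the", "."]
V, H = len(vocab), 12
idx = {w: i for i, w in enumerate(vocab)}

# Elman SRN: the hidden layer at time t receives the current word and a copy
# of the hidden layer at time t-1 (the "context" units).
W_in = rng.normal(0, 0.1, (V, H))
W_ctx = rng.normal(0, 0.1, (H, H))
W_out = rng.normal(0, 0.1, (H, V))

def step(word, h_prev):
    x = np.zeros(V); x[idx[word]] = 1.0          # one-hot input
    h = np.tanh(x @ W_in + h_prev @ W_ctx)       # new hidden state
    p = softmax(h @ W_out)                       # distribution over the next word
    return h, p

h = np.zeros(H)
for w in ["the", "boy", "sees"]:
    h, p = step(w, h)

# After "the boy sees", the (untrained) network assigns some probability to
# every word; training with backpropagation through time would sharpen this.
print({w: round(float(p[i]), 3) for w, i in idx.items()})
```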
Computational modeling of dynamic decision making using connectionist networks
In this research, a connectionist model of decision making is presented. Important brain areas for decision making are the thalamus, the prefrontal cortex, and the amygdala. A connectionist model with three modules representing these three areas is built based on results from the Iowa Gambling Task. In many studies, the Iowa Gambling Task is used to investigate emotional decision making. In these kind of decisio...